MkLlm

Node for LLM-based text generation.

Example: Regular

Jinja

{{ "Write a poem about MkDocs" | MkLlm(model="simple-openai:gpt-4o-mini") }}

Python

MkLlm('Write a poem about MkDocs', model='simple-openai:gpt-4o-mini')

In the realm where knowledge flows,
A treasure trove, where wisdom grows,
MkDocs stands, a beacon bright,
Crafting docs in pure delight.

With simple syntax, clean and clear,
Markdown whispers, we draw near,
Each page a canvas, waiting to bloom,
Unfolding stories, dispelling gloom.

From code to prose, it brings to life,
A seamless dance amid the strife,
With themes that shimmer, vibrant and bold,
A tapestry of knowledge, waiting to unfold.

Navigation sleek, with ease we roam,
Through chapters vast, it feels like home,
Links that guide like constellations in night,
Each click a journey, each scroll a flight.

Version control, it guards the past,
Preserving wisdom, forever to last,
A community united, sharing their art,
In every line, a piece of the heart.

So here's to MkDocs, our guiding star,
For every developer, near and far,
In this digital age, where info is king,
With every build, new visions take wing.


Bases: MkText

text property

text: str

__init__

__init__(
    user_prompt: str,
    system_prompt: str | None = None,
    model: str = "openai:gpt-4o-mini",
    context: str | None = None,
    extra_files: Sequence[str | PathLike[str]] | None = None,
    **kwargs: Any
)

Parameters:

| Name | Type | Description | Default |
|------|------|-------------|---------|
| `user_prompt` | `str` | Main prompt for the LLM | *required* |
| `system_prompt` | `str \| None` | System prompt to set LLM behavior | `None` |
| `model` | `str` | LLM model identifier to use | `'openai:gpt-4o-mini'` |
| `context` | `str \| None` | Main context string | `None` |
| `extra_files` | `Sequence[str \| PathLike[str]] \| None` | Additional context files or strings | `None` |
| `kwargs` | `Any` | Keyword arguments passed to parent | `{}` |
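The `extra_files` parameter accepts both real filesystem paths and literal strings: items that resolve to a file are read, directories are read recursively, and anything else is passed through as raw context. A minimal sketch of that fallback behavior, using `pathlib` in place of the `UPath` type the implementation uses (the `gather_context` name here is illustrative, not part of the API):

```python
import tempfile
from pathlib import Path


def gather_context(items):
    """Collect context strings: read files, recurse into dirs, keep raw strings."""
    collected = []
    for item in items:
        path = Path(str(item))
        if path.is_file():
            collected.append(path.read_text())
        elif path.is_dir():
            # Flatten a directory into one string per contained file.
            collected.extend(
                f.read_text() for f in sorted(path.rglob("*")) if f.is_file()
            )
        else:
            # Not an existing path: treat the item as inline context.
            collected.append(str(item))
    return collected


with tempfile.TemporaryDirectory() as tmp:
    notes = Path(tmp) / "notes.txt"
    notes.write_text("file-based context")
    print(gather_context([notes, "inline context string"]))
    # ['file-based context', 'inline context string']
```

Note that the actual implementation raises `ValueError` when a context file cannot be read, rather than silently skipping it.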

_process_extra_files

_process_extra_files() -> list[str]
| Name | Children | Inherits |
|------|----------|----------|
| `MkLlm` | | `MkText` (`mknodes.basenodes.mktext`): Class for any Markup text. |
graph TD
  MkLlm["mkllm.MkLlm"]
  MkText["mktext.MkText"]
  MkNode["mknode.MkNode"]
  Node["node.Node"]
  object["builtins.object"]
  MkText --> MkLlm
  MkNode --> MkText
  Node --> MkNode
  object --> Node
/home/runner/work/mknodes/mknodes/mknodes/templatenodes/mkllm/metadata.toml
[metadata]
icon = "mdi:view-grid"
status = "new"
name = "MkLlm"

[examples.regular]
title = "Regular"
jinja = """
{{ "Write a poem about MkDocs" | MkLlm(model="simple-openai:gpt-4o-mini") }}
"""

# [output.markdown]
# template = """
# <div class="grid cards" markdown="1">

# {% for item in node.items %}
# -   {{ item | indent }}
# {% endfor %}
# </div>
# """
mknodes.templatenodes.mkllm.MkLlm
class MkLlm(mktext.MkText):
    """Node for LLM-based text generation."""

    ICON = "material/format-list-group"
    REQUIRED_PACKAGES = [resources.Package("litellm")]

    def __init__(
        self,
        user_prompt: str,
        system_prompt: str | None = None,
        model: str = "openai:gpt-4o-mini",
        context: str | None = None,
        extra_files: Sequence[str | os.PathLike[str]] | None = None,
        **kwargs: Any,
    ):
        """Constructor.

        Args:
            user_prompt: Main prompt for the LLM
            system_prompt: System prompt to set LLM behavior
            model: LLM model identifier to use
            context: Main context string
            extra_files: Additional context files or strings
            kwargs: Keyword arguments passed to parent
        """
        super().__init__(**kwargs)
        self.user_prompt = user_prompt
        self.system_prompt = system_prompt
        self._model = model
        self._context = context
        self._extra_files = extra_files or []

    def _process_extra_files(self) -> list[str]:
        """Process extra context items, reading files if necessary.

        Returns:
            List of context strings.
        """
        context_items: list[str] = []

        def process_dir(path: UPath) -> list[str]:
            return [f.read_text() for f in path.rglob("*") if f.is_file()]

        for item in self._extra_files:
            try:
                path = UPath(item)
                if path.is_file():
                    context_items.append(path.read_text())
                elif path.is_dir():
                    context_items.extend(process_dir(path))
                else:
                    context_items.append(str(item))
            except Exception as exc:
                err_msg = f"Failed to read context file: {item}"
                logger.warning(err_msg)
                raise ValueError(err_msg) from exc

        return context_items

    @property
    def text(self) -> str:
        """Generate text using the LLM.

        Returns:
            Generated text content.
        """
        context_items = self._process_extra_files()
        combined_context = (
            "\n".join(filter(None, [self._context, *context_items])) or None
        )

        return complete_llm(
            self.user_prompt,
            self.system_prompt or "",
            model=self._model,
            context=combined_context or "",
        )
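The `text` property folds the main `context` string and the per-file strings into a single context for the LLM call; `filter(None, ...)` drops empty or missing pieces so an absent `context` does not inject blank lines into the prompt. The joining pattern in isolation (the `combine_context` name is illustrative):

```python
def combine_context(main, extra):
    """Join non-empty context pieces with newlines; None when nothing remains."""
    return "\n".join(filter(None, [main, *extra])) or None


print(combine_context("project readme", ["api notes", ""]))
# project readme
# api notes
print(combine_context(None, []))
# None
```

Because the joined string is empty when every piece is falsy, the trailing `or None` lets callers distinguish "no context at all" from an actual context value.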